Preprint ANL/MCS-P5112-0314 DATA STRUCTURE AND ALGORITHMS FOR RECURSIVELY LOW-RANK COMPRESSED MATRICES

Author

  • JIE CHEN
Abstract

We present a data structure and several operations it supports for recursively low-rank compressed matrices; that is, the diagonal blocks of the matrix are recursively partitioned, and the off-diagonal blocks in each partition level admit a low-rank approximation. Such a compression is embraced in many linear- or near-linear-time methods for kernel matrices, including the fast multipole method, the framework of hierarchical matrices, and several other variants. For this compressed representation, we develop a principled data structure that enables the design of matrix algorithms by using tree traversals and that facilitates computer implementation, especially in the parallel setting. We present three basic operations supported by the data structure—matrix-vector multiplication, matrix inversion, and determinant calculation—all of which have a strictly linear cost. These operations consequently enable the solution of other matrix problems with practical significance, such as the solution of a linear system and the computation of the diagonal of a matrix inverse. We show comprehensive experiments with various matrices and kernels to demonstrate the favorable numerical behavior and computational scaling of the algorithms.
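To make the recursive compression concrete, the following is a minimal sketch (not the paper's actual data structure; all names are illustrative) of a two-way recursive partition in which each off-diagonal block is stored in factored form U·Vᵀ, and of the tree-traversal matrix-vector product that runs in linear time when the ranks are bounded:

```python
import numpy as np

# Illustrative sketch of a recursively low-rank compressed matrix:
# diagonal blocks are partitioned recursively; the two off-diagonal
# blocks at each level are stored as low-rank factors U @ V.T.
class Node:
    def __init__(self, dense=None, children=None,
                 U12=None, V12=None, U21=None, V21=None):
        self.dense = dense            # leaf: dense diagonal block
        self.children = children      # internal: (first child, second child)
        self.U12, self.V12 = U12, V12 # factors of the (1,2) off-diagonal block
        self.U21, self.V21 = U21, V21 # factors of the (2,1) off-diagonal block

def matvec(node, x):
    """Matrix-vector product by tree traversal: dense multiplies at the
    leaves, low-rank multiplies for the off-diagonal blocks at each level."""
    if node.dense is not None:
        return node.dense @ x
    first, second = node.children
    n1 = node.U12.shape[0]            # size of the first diagonal block
    x1, x2 = x[:n1], x[n1:]
    y1 = matvec(first, x1) + node.U12 @ (node.V12.T @ x2)
    y2 = node.U21 @ (node.V21.T @ x1) + matvec(second, x2)
    return np.concatenate([y1, y2])
```

With bounded off-diagonal ranks, each level of the tree does O(n) work, giving the strictly linear cost the abstract claims for the multiplication operation.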


Similar articles

Unifying Low-Rank Models for Visual Learning

Many problems in signal processing, machine learning and computer vision can be solved by learning low-rank models from data. In computer vision, problems such as rigid structure from motion have been formulated as an optimization over subspaces with fixed rank. These hard rank constraints have traditionally been imposed by a factorization that parameterizes subspaces as a product of two matri...
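The factorization idea this excerpt refers to can be illustrated in a few lines (a generic sketch, not any specific paper's formulation): a rank constraint rank(X) ≤ r is enforced "for free" by parameterizing X as a product of two thin matrices.

```python
import numpy as np

# Hard rank constraint via factorization: any X = A @ B with
# A of shape (m, r) and B of shape (r, n) satisfies rank(X) <= r
# by construction, so optimizing over (A, B) never leaves the
# rank-r set. Dimensions here are arbitrary for illustration.
rng = np.random.default_rng(0)
m, n, r = 6, 5, 2
A = rng.standard_normal((m, r))
B = rng.standard_normal((r, n))
X = A @ B   # rank(X) <= r, with no explicit constraint needed
```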


The Geometry of Compressed Sensing

Most developments in compressed sensing have revolved around the exploitation of signal structures that can be expressed and understood most easily using a geometrical interpretation. This geometric point of view not only underlies many of the initial theoretical developments on which much of the theory of compressed sensing is built, but has also allowed ideas to be extended to much more gener...


STRUMPACK meets Kernel Matrices

As the main part of my internship I investigated ways to apply the STRUMPACK algorithm (a fast linear solver developed by the Scalable Solvers Group) to kernel matrices coming from machine learning and non-parametric statistics applications. Effective preprocessing methods allowed up to 10 times more efficient compression of the matrix than naive application of STRUMPACK. I have also tested compressed ...


Robust Low-Rank Modelling on Matrices and Tensors

Robust low-rank modelling has recently emerged as a family of powerful methods for recovering the low-dimensional structure of grossly corrupted data, and has become successful in a wide range of applications in signal processing and computer vision. In principle, robust low-rank modelling focuses on decomposing a given data matrix into a low-rank and a sparse component, by minimising the rank ...
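The low-rank-plus-sparse model described in this excerpt can be sketched as follows (illustrative data only; real methods such as robust PCA recover L and S from M by minimizing a convex surrogate like the nuclear norm of L plus a weighted ℓ₁ norm of S):

```python
import numpy as np

# The model behind robust low-rank modelling: observed data M is
# the sum of a low-rank component L (clean structure) and a sparse
# component S (gross corruptions). Sizes and values are illustrative.
rng = np.random.default_rng(1)
m, n, r = 20, 15, 2
L = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # low-rank part
S = np.zeros((m, n))
idx = rng.choice(m * n, size=10, replace=False)
S.flat[idx] = 10.0                                             # few large outliers
M = L + S                                                      # observed matrix

def soft_threshold(X, tau):
    """Elementwise shrinkage operator used by l1-regularized solvers."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)
```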


Structured random measurements in signal processing

Compressed sensing and its extensions have recently triggered interest in randomized signal acquisition. A key finding is that random measurements provide sparse signal reconstruction guarantees for efficient and stable algorithms with a minimal number of samples. While this was first shown for (unstructured) Gaussian random measurement matrices, applications require certain structure of the me...
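The Gaussian-measurement setup mentioned in this excerpt can be sketched in a few lines (a toy illustration, not a full recovery algorithm): a k-sparse signal is observed through m ≪ n random measurements y = Φx; here the support is assumed known so least squares suffices, whereas practical algorithms (basis pursuit, OMP) recover it from y and Φ alone.

```python
import numpy as np

# A k-sparse signal observed through an unstructured Gaussian
# measurement matrix Phi with far fewer rows than columns.
rng = np.random.default_rng(2)
n, m, k = 100, 25, 3
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.standard_normal(k)

Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian measurements
y = Phi @ x                                      # m << n samples

# With the support known, x is recovered exactly by least squares,
# since Phi restricted to 3 columns has full column rank almost surely.
x_hat = np.zeros(n)
x_hat[support], *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
```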



Journal:

Volume   Issue

Pages  -

Publication date: 2014